Dynamics of Randomness and Information
MATH005 Lesson 9
00:00
Imagine a world where the future is not a fixed path, but a shimmering web of possibilities. To master the Dynamics of Randomness is to bridge the gap between stochastic evolution—how systems migrate across states—and the quantification of the "newness" or surprise inherent in those transitions.

1. The Architecture of State Transitions

Consider the logic of weather. If we assume today's rain is the only factor influencing tomorrow, we enter the realm of Markovian dynamics. This is elegantly captured in Example 2a:

Suppose that whether it rains tomorrow depends on previous weather conditions only through whether it is raining today. If it rains today, it rains tomorrow with probability $\alpha$; if not, it rains tomorrow with probability $\beta$.

This creates a transition matrix $P$ where we can calculate the future probability flow using the Chapman-Kolmogorov Identity:

$$P_{ij}^{(2)} = \sum_{k=0}^{M} P_{ik}P_{kj}$$
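The Chapman-Kolmogorov identity says matrix multiplication computes multi-step probabilities: $P^{(2)} = P \cdot P$. A minimal sketch for the two-state weather chain, where the values $\alpha = 0.7$ and $\beta = 0.4$ are illustrative assumptions (not taken from the lesson):

```python
# Two-step transition probabilities for the rain / no-rain chain,
# via the Chapman-Kolmogorov identity P^(2) = P @ P.
# alpha = 0.7 and beta = 0.4 are illustrative assumptions.

alpha, beta = 0.7, 0.4  # P(rain tomorrow | rain today), P(rain tomorrow | dry today)

# State 0 = rain, state 1 = no rain. Each row sums to 1.
P = [[alpha, 1 - alpha],
     [beta,  1 - beta]]

def matmul(A, B):
    """Square-matrix product: the sum over intermediate states k."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P2 = matmul(P, P)
# P2[0][0] = alpha*alpha + (1 - alpha)*beta: probability it rains
# two days from now, given rain today (0.49 + 0.12 = 0.61).
print(P2[0][0])
```

Repeating the multiplication gives $P^{(n)}$, the $n$-step transition matrix.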

2. The Rhythm of Arrival

Randomness is not just about where we go, but when events occur. In a Poisson process, we track discrete arrivals (like earthquakes or radioactive decay) over time.

  • Interarrival Times: For a Poisson process, let $T_1$ denote the time the first event occurs. For $n > 1$, let $T_n$ denote the time elapsed between the $(n-1)$st and the $n$th event.
  • Distribution: The sequence $\{T_n, n=1, 2, \ldots\}$ consists of independent, identically distributed exponential random variables with rate $\lambda$ (mean $1/\lambda$), a consequence of the process's stationary and independent increments.
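The bullets above suggest a direct way to simulate a Poisson process: keep summing exponential interarrival times. A sketch, where the rate `lam = 2.0` events per unit time and the horizon are illustrative assumptions:

```python
import random

# Simulate a Poisson process with rate lam by summing independent
# Exponential(lam) interarrival times T_1, T_2, ...
# lam = 2.0 and the horizon are illustrative assumptions.

random.seed(0)
lam = 2.0
horizon = 10_000.0

t, count = 0.0, 0
while True:
    t += random.expovariate(lam)  # next interarrival time T_n ~ Exp(lam)
    if t > horizon:
        break
    count += 1

# By the law of large numbers, the observed arrival rate count / horizon
# should be close to lam.
print(count / horizon)
```

The empirical rate converging to $\lambda$ illustrates why $\lambda$ is called the rate of the process.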

3. Information as the Reduction of Surprise

Information theory, pioneered by Claude Shannon, quantifies uncertainty. It rests on a beautiful algebraic foundation, specifically Axiom 4:

Axiom 4: $S(pq) = S(p) + S(q)$ for $0 < p \le 1, 0 < q \le 1$

This axiom implies that the surprise of two independent events is the sum of their individual surprises, leading directly to the definition of Shannon Entropy:

$$H(X) = -\sum_{i=1}^n p_i \log_2(p_i)$$
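Both the axiom and the entropy formula are short enough to verify numerically. A sketch, assuming base-2 logarithms so that surprise and entropy are measured in bits:

```python
import math

def surprise(p):
    """S(p) = -log2(p): the form singled out by Axiom 4, S(pq) = S(p) + S(q)."""
    return -math.log2(p)

def entropy(dist):
    """Shannon entropy H(X) = sum_i p_i * S(p_i), in bits."""
    return sum(p * surprise(p) for p in dist if p > 0)

# Axiom 4: surprises of independent events add.
p, q = 0.5, 0.25
assert abs(surprise(p * q) - (surprise(p) + surprise(q))) < 1e-12

print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(entropy([0.25] * 4))   # uniform over 4 outcomes: 2.0 bits
```

Note the `p > 0` guard: terms with $p_i = 0$ contribute nothing, consistent with the convention $0 \log 0 = 0$.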

🎯 Core Insight
Dynamics define the rules of the game (Transition Probabilities), while Entropy measures how much we learn by actually playing the game (Information Gain). If $\alpha=1$ and $\beta=1$ in our weather model, rain tomorrow is certain no matter what happens today; the system is deterministic, and the entropy is zero because the forecast provides no new information.
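This closing observation can be checked with the binary entropy of tomorrow's weather given rain today, a sketch assuming base-2 logarithms:

```python
import math

def binary_entropy(p):
    """Entropy (bits) of a Bernoulli(p) outcome, e.g. rain vs. no rain tomorrow."""
    if p in (0.0, 1.0):
        return 0.0  # deterministic outcome: the forecast carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(binary_entropy(1.0))  # alpha = 1: deterministic weather, 0.0 bits
print(binary_entropy(0.5))  # maximally uncertain weather, 1.0 bit
```

Entropy peaks at $p = 0.5$ and vanishes at the deterministic extremes, exactly the behavior the Core Insight describes.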